title slide
What do Printed Circuit Board simulations...
subsurface modelling...
Electrical Impedance Tomography...
Composite materials...
Proton Exchange Membranes...
We want to find the solution u. To do so, we discretize...
we get a linear system...
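The step from discretization to a linear system can be sketched in a few lines; a minimal 1D finite-difference assembly for -(c(x)u')' = f with homogeneous Dirichlet conditions (the model problem, grid size, and coefficient sampling are illustrative assumptions, not the thesis's actual setup):

```python
import numpy as np

def assemble_1d(n, c):
    """Finite-difference matrix for -(c(x) u')' = f on (0, 1), u(0) = u(1) = 0.
    c holds the coefficient sampled at the n + 1 cell interfaces."""
    h = 1.0 / (n + 1)
    A = np.zeros((n, n))
    for i in range(n):
        A[i, i] = (c[i] + c[i + 1]) / h**2      # diagonal couples both interfaces
        if i + 1 < n:
            A[i, i + 1] = A[i + 1, i] = -c[i + 1] / h**2
    return A

# Constant coefficient recovers the standard Poisson matrix tridiag(-1, 2, -1) / h^2.
A = assemble_1d(5, np.ones(6))
```

The resulting A is symmetric positive definite, which is what CG requires.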
We solve this system using the Conjugate Gradient (CG) method...
Table of Contents
Explain CG as an iterative method for solving Ax=b
We provide an initial guess...
which gives an initial residual.
We specify a desired error tolerance and feed everything into CG...
which runs the following algorithm...
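The algorithm can be sketched as follows; a minimal textbook CG in NumPy (the relative stopping test and the variable names are illustrative choices):

```python
import numpy as np

def cg(A, b, x0, tol=1e-8, max_iter=1000):
    """Textbook Conjugate Gradient for SPD A.
    Returns the jth approximation and the number of iterations taken."""
    x = x0.copy()
    r = b - A @ x                  # initial residual r_0
    p = r.copy()
    rs = r @ r
    for j in range(1, max_iter + 1):
        Ap = A @ p
        alpha = rs / (p @ Ap)      # step length along the search direction
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) <= tol * np.linalg.norm(b):
            return x, j
        p = r + (rs_new / rs) * p  # new A-conjugate search direction
        rs = rs_new
    return x, max_iter

# SPD test problem: diagonally dominant random matrix.
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = M @ M.T + 50 * np.eye(50)
b = rng.standard_normal(50)
x, its = cg(A, b, np.zeros(50))
```

In exact arithmetic CG terminates in at most n steps; in practice the iteration count is governed by the spectrum, which is the theme of this talk.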
We recognize the initial residual and error tolerance as inputs, and note the error at the jth iteration.
CG outputs the jth approximation and the corresponding error.
with errors decreasing as j increases.
The classical CG error bound relates the error to the condition number kappa(A) and the number of iterations m.
From this we can derive a classical iteration bound...
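Solving the classical error bound 2((sqrt(kappa) - 1)/(sqrt(kappa) + 1))^m <= eps for m gives the iteration bound; a small sketch (the function name and example values of kappa and eps are illustrative):

```python
import numpy as np

def classical_iteration_bound(kappa, eps):
    """Smallest m with 2 * ((sqrt(kappa) - 1) / (sqrt(kappa) + 1))**m <= eps,
    i.e. the iteration bound derived from the classical CG error bound."""
    rho = (np.sqrt(kappa) - 1.0) / (np.sqrt(kappa) + 1.0)
    return int(np.ceil(np.log(eps / 2.0) / np.log(rho)))

m_moderate = classical_iteration_bound(1e2, 1e-6)  # moderate condition number
m_high = classical_iteration_bound(1e8, 1e-6)      # high contrast: huge bound
```

The bound grows like sqrt(kappa), which is exactly why high-contrast problems make it explode.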
The problem is that for high-contrast problems, kappa is very large, leading to very pessimistic bounds on m.
This is the main research question of this thesis.
Explain CG's dependence on eigenvalues
That is travelling...
We can decompose this signal into its frequency components...
Returning to our original system...
We can do a similar decomposition of A resulting in its eigenvalues or spectrum...
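In code, the decomposition amounts to an eigendecomposition A = Q diag(lams) Q^T; a quick sketch on an arbitrary SPD example matrix (not the thesis's operator):

```python
import numpy as np

# Spectrum of an SPD matrix via its eigendecomposition A = Q diag(lams) Q^T.
rng = np.random.default_rng(1)
M = rng.standard_normal((30, 30))
A = M @ M.T + np.eye(30)       # SPD by construction

lams, Q = np.linalg.eigh(A)    # eigenvalues ascending, orthonormal eigenvectors
kappa = lams[-1] / lams[0]     # condition number of an SPD matrix

# A is fully determined by its spectrum and eigenvectors:
assert np.allclose(Q @ np.diag(lams) @ Q.T, A)
```

For SPD matrices the condition number is simply the ratio of the extreme eigenvalues, which ties the spectrum directly to the classical bound.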
Why is this important? To see this, we look at the CG solution polynomial...
Which is related to the residual polynomial...
The CG error can be bounded in terms of the residual polynomial...
which can recover the classical bound by assuming a uniform distribution of eigenvalues...
We can now look at different spectra...
...Let's look at a uniform spectrum now.
For a uniform spectrum, the classical bound is sharp.
For a split spectrum, the classical bound is pessimistic.
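The uniform-versus-split comparison can be reproduced numerically: diagonal matrices let us prescribe the spectrum exactly, so both cases below have the same kappa = 200 (the eigenvalue grids, tolerance, and right-hand side are illustrative assumptions):

```python
import numpy as np

def cg_iterations(diag, tol=1e-10, max_iter=1000):
    """Iterations plain CG needs on A = diag(diag) with b = ones."""
    A = np.diag(diag)
    b = np.ones(len(diag))
    x = np.zeros_like(b)
    r, p = b.copy(), b.copy()
    rs = r @ r
    for j in range(1, max_iter + 1):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x, r = x + alpha * p, r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) <= tol * np.linalg.norm(b):
            return j
        p, rs = r + (rs_new / rs) * p, rs_new
    return max_iter

# Same condition number kappa = 200 in both cases.
uniform = np.linspace(1.0, 200.0, 100)
split = np.concatenate([np.linspace(1.0, 2.0, 50), np.linspace(199.0, 200.0, 50)])

it_uniform = cg_iterations(uniform)
it_split = cg_iterations(split)
```

The split spectrum converges in far fewer iterations despite the identical condition number, so the classical bound cannot tell the two cases apart.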
This is the main research question of this thesis.
Recap: High-contrast leads to
high values for kappa and thus a pessimistic bound m_1 on m.
The problem of large condition numbers can be mitigated by using preconditioning. We transform the system...
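A minimal sketch of how a preconditioner shrinks kappa: a Jacobi (diagonal) preconditioner applied symmetrically to a high-contrast 1D model matrix (the model problem and the contrast value 1e4 are illustrative assumptions, not the thesis's test cases):

```python
import numpy as np

# High-contrast 1D model: -(c(x) u')' with c jumping from 1 to 1e4.
n = 50
c = np.where(np.arange(n + 1) < (n + 1) // 2, 1.0, 1e4)  # interface coefficients
A = np.zeros((n, n))
for i in range(n):
    A[i, i] = c[i] + c[i + 1]
    if i + 1 < n:
        A[i, i + 1] = A[i + 1, i] = -c[i + 1]

def kappa(B):
    lams = np.linalg.eigvalsh(B)
    return lams[-1] / lams[0]

# Jacobi preconditioning, M = diag(A): symmetrically scale by the diagonal.
d = 1.0 / np.sqrt(np.diag(A))
A_prec = d[:, None] * A * d[None, :]     # D^{-1/2} A D^{-1/2}

k_orig, k_prec = kappa(A), kappa(A_prec)
```

The scaled system has the same solution (after back-transforming) but a far smaller condition number, which is what the classical bound rewards.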
Let's look at two specific examples of coefficient functions.
We consider three preconditioners, which we will refer to as M1, M2, and M3.
All three preconditioners resulted in similar condition numbers...
However, the number of CG iterations varied significantly...
So we need to refine our research question further.
We recap the classical CG error bound...
which can be solved using the Chebyshev polynomial...
actually, a scaled Chebyshev polynomial...
But what if we reintroduce the full spectrum of A?
we consider the simplest non-trivial case: two-cluster spectra.
How can we solve the min-max problem in this case?
We use the product of two scaled Chebyshev polynomials.
This leads to the two-cluster CG iteration bound from Axelsson (1976).
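The product construction can be checked numerically: each factor is a Chebyshev polynomial mapped to one cluster and scaled to equal 1 at lambda = 0, and the product stays far smaller on the spectrum than a single scaled Chebyshev of the same total degree on the whole interval. The cluster endpoints and degrees below are illustrative choices, not values from the thesis:

```python
import numpy as np

def cheb(k, x):
    """Chebyshev polynomial T_k, evaluated inside and outside [-1, 1]."""
    x = np.asarray(x, dtype=float)
    inside = np.cos(k * np.arccos(np.clip(x, -1.0, 1.0)))
    outside = np.sign(x) ** k * np.cosh(k * np.arccosh(np.maximum(np.abs(x), 1.0)))
    return np.where(np.abs(x) <= 1.0, inside, outside)

def scaled_cheb(k, lam, a, b):
    """Residual-polynomial factor for the cluster [a, b]: equals 1 at lam = 0."""
    return cheb(k, (a + b - 2.0 * lam) / (b - a)) / cheb(k, (a + b) / (b - a))

# Two clusters [0.01, 0.02] and [1, 2]; the full interval has kappa = 200.
c1 = np.linspace(0.01, 0.02, 400)
c2 = np.linspace(1.0, 2.0, 400)

# Product of two scaled Chebyshev factors, total degree 2 + 9 = 11.
prod = lambda lam: scaled_cheb(2, lam, 0.01, 0.02) * scaled_cheb(9, lam, 1.0, 2.0)
# Single scaled Chebyshev of the same degree 11 on the whole interval.
single = lambda lam: scaled_cheb(11, lam, 0.01, 2.0)

prod_max = max(np.abs(prod(c1)).max(), np.abs(prod(c2)).max())
single_max = max(np.abs(single(c1)).max(), np.abs(single(c2)).max())
```

Both polynomials are valid residual polynomials (value 1 at zero), but the product exploits the gap between the clusters, which is exactly what the two-cluster bound captures.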
We actually want to develop the most general bound...
leading to the most general CG iteration bound.
This is a simplified view of the general CG iteration bound.
But how do we get from the spectrum of A to a set of clusters?
We start by assuming no knowledge of the cluster boundaries...
we find the largest relative gap in the spectrum...
If this condition is satisfied, we perform recursion on the two partitions.
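The recursive splitting can be sketched as follows; note that the acceptance test used here (ratio of successive eigenvalues above a threshold) is an assumed stand-in for the thesis's actual gap condition:

```python
import numpy as np

def cluster_spectrum(lams, gap_tol=10.0):
    """Recursively partition a spectrum at its largest relative gap.
    The split criterion (successive-eigenvalue ratio > gap_tol) is an
    assumption for illustration, not the thesis's exact condition."""
    lams = np.sort(np.asarray(lams, dtype=float))
    if len(lams) < 2:
        return [(lams[0], lams[0])]
    ratios = lams[1:] / lams[:-1]          # relative gaps between neighbours
    i = int(np.argmax(ratios))             # largest relative gap
    if ratios[i] <= gap_tol:
        return [(lams[0], lams[-1])]       # no significant gap: one cluster
    # condition satisfied: recurse on the two partitions
    return (cluster_spectrum(lams[:i + 1], gap_tol)
            + cluster_spectrum(lams[i + 1:], gap_tol))

clusters = cluster_spectrum([1.0, 1.5, 2.0, 1e3, 2e3, 1e6])
```

Each returned pair is the interval [lambda_min, lambda_max] of one cluster, ready to feed into the clustered iteration bound.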
This results in two partitions; recursing on each partition in turn yields the final set of clusters.
So we need to refine our research question further.
Discuss results on absolute sharpness of the new bounds
Show other meshes
Summarize the main points and contributions of the work
References